Friday, June 9, 2023

AI Transformers for Biomedicine

Inspired by ViLBERT’s success in modeling visual-linguistic representations, a new paper published in Radiology: Artificial Intelligence introduced a coattentional transformer block to improve image processing and three-dimensional prediction in radiology. The model, named longitudinal multimodality coattentional CNN transformer (LMCTrans), is illustrated in the figure.
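
To make the co-attention idea concrete, here is a minimal PyTorch sketch of a generic ViLBERT-style co-attentional block in which each of two feature streams (for example, flattened CNN feature maps from a baseline and a follow-up scan) attends to the other. This illustrates the general mechanism only, not the published LMCTrans architecture; all names and dimensions are illustrative.

    import torch
    import torch.nn as nn

    class CoAttentionBlock(nn.Module):
        """Generic co-attentional block: each stream attends to the other."""
        def __init__(self, dim: int = 256, heads: int = 8):
            super().__init__()
            self.attn_a = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.attn_b = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.norm_a = nn.LayerNorm(dim)
            self.norm_b = nn.LayerNorm(dim)

        def forward(self, a, b):
            # Queries come from one stream; keys/values from the other,
            # so each modality/time point is conditioned on its partner.
            a2, _ = self.attn_a(query=a, key=b, value=b)
            b2, _ = self.attn_b(query=b, key=a, value=a)
            return self.norm_a(a + a2), self.norm_b(b + b2)

    # Illustrative shapes: two 64-token, 256-dim feature sequences.
    a = torch.randn(1, 64, 256)
    b = torch.randn(1, 64, 256)
    out_a, out_b = CoAttentionBlock()(a, b)
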

Over 100 pretrained language models based on transformer architectures (T-PLMs) have been described in the medical domain.

The original transformer, introduced in "Attention Is All You Need" (Vaswani et al., 2017), was a breakthrough model that showed attention alone can effectively learn long-range dependencies in sequences.
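
For readers who want the mechanics, the following NumPy sketch implements the paper's scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. Because every query position is compared against every key position in a single step, distant tokens can interact directly; the shapes and values here are toy examples.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al., 2017)."""
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                 # all-pairs similarities
        scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
        return weights @ V                              # weighted sum of values

    # Toy example: 4 tokens with 8-dimensional representations.
    rng = np.random.default_rng(0)
    Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
    out = scaled_dot_product_attention(Q, K, V)         # shape (4, 8)
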

Several medical models were built by pretraining and fine-tuning BERT (bidirectional encoder representations from transformers) and closely related encoders. Examples are BioClinicalBERT, MIMIC-BERT, ClinicalBERT, BERT-MIMIC, XLNet-MIMIC, RoBERTa-MIMIC, ELECTRA-MIMIC, ALBERT-MIMIC, DeBERTa-MIMIC, Longformer-MIMIC, MedBERT, BEHRT, BERT-EHR, RAD-BERT, CT-BERT, BioRedditBERT, RuDR-BERT, EnRuDR-BERT, EnDR-BERT, BioBERT, RoBERTa-base-PM, RoBERTa-base-PM-Voc, PubMedBERT, BioELECTRA and BioELECTRA++, OuBioBERT, BlueBERT-PM, BioMedBERT, ELECTRAMed, BioELECTRA-P, BioELECTRA-PM, BioALBERT-P, BioALBERT-PM, BlueBERT-PM-M3, RoBERTa-base-PM-M3, RoBERTa-base-PM-M3-Voc, BioCharBERT, AraBioBERT, SciBERT, BioALBERT-P-M3, Clinical Kb-BERT, Clinical Kb-ALBERT, UmlsBERT, CoderBERT, CoderBERT-ALL, SapBERT, SapBERT-XLMR, KeBioLM, BERT(jpCR+jpW), BioBERTpt-bio, BioBERTpt-clin, BioBERTpt-all, FS-BERT, CHMBERT, SpanishBERT, CamemBioBERT, MC-BERT, UTH-BERT, SINA-BERT, mBERT-Galen, BETO-Galen, XLM-R-Galen, GreenBioBERT, and exBERT.
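
The recurring recipe behind most of these models is the same: take a domain-pretrained encoder checkpoint and fine-tune it on a downstream task. A minimal sketch with the Hugging Face transformers library follows; the PubMedBERT checkpoint name and the binary-classification setup are illustrative assumptions, not drawn from any particular paper above.

    # Assumed: the `transformers` library and a Hub checkpoint; the model
    # name and the binary-classification task below are illustrative only.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    checkpoint = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract"
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=2  # new task head; fine-tune on labeled notes
    )

    inputs = tokenizer(
        "Patient denies chest pain or shortness of breath.",
        return_tensors="pt", truncation=True,
    )
    with torch.no_grad():
        logits = model(**inputs).logits  # untrained head: fine-tune before use
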

Other biomedical foundation models are mostly built on BART, LLaMA, and GPT; see the references for more.


REFERENCES

Wang YJ, Qu L, Sheybani ND, Luo X, Wang J, Hawk KE, Theruvath AJ, Gatidis S, Xiao X, Pribnow A, Rubin D, Daldrup-Link HE. AI Transformers for Radiation Dose Reduction in Serial Whole-Body PET Scans. Radiol Artif Intell. 2023 May 3;5(3):e220246. doi: 10.1148/ryai.220246. PMID: 37293349; PMCID: PMC10245181.

Katikapalli Subramanyam Kalyan, Ajit Rajasekharan, Sivanesan Sangeetha, AMMU: A survey of transformer-based biomedical pretrained language models, Journal of Biomedical Informatics, Volume 126, 2022, 103982, ISSN 1532-0464, https://doi.org/10.1016/j.jbi.2021.103982

Kalyan KS. Transformer-based Biomedical Pretrained Language Models List. mr-nlp.github.io.

Cho HN, Jun TJ, Kim YH, Kang HJ, Ahn I, Gwon H, Kim Y, Seo H, Choi H, Kim M, Han J, Kee G, Park S, Ko S. Task-Specific Transformer-Based Language Models in Medicine: A Survey. JMIR Preprints. 2023 Jun 7:49724.

Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser Ł, Polosukhin I. Attention is all you need. Advances in neural information processing systems. 2017;30.

Shamshad F, Khan S, Zamir SW, Khan MH, Hayat M, Khan FS, Fu H. Transformers in medical imaging: A survey. Medical Image Analysis. 2023 Apr 5:102802.
